# 3B Parameter Optimization
## Ministral 3b Instruct

- License: Apache-2.0
- Description: Ministral is a small-scale language-model series built on the Mistral architecture, with approximately 3 billion parameters, designed primarily for English text-generation tasks.
- Tags: Large Language Model, Transformers, English
- Publisher: ministral
- Stats: 15.89k · 53
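
The card tags the model for use with the Transformers library for English text generation. Below is a minimal sketch of loading it for causal generation; the repository id `ministral/Ministral-3b-instruct` is an assumption and should be checked against the actual model page.

```python
# Minimal sketch: loading Ministral 3b Instruct with Hugging Face Transformers.
# The repository id below is an assumption; substitute the id listed on the
# model page if it differs.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "ministral/Ministral-3b-instruct"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# English text generation, the task this model series is primarily designed for.
prompt = "Explain what a 3-billion-parameter language model is good for:"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=100, do_sample=True, temperature=0.7)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```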